Langevin-Type Models II: Self-Targeting Candidates for MCMC Algorithms*
Author
Abstract
The Metropolis-Hastings algorithm for estimating a distribution p is based on choosing a candidate Markov chain and then accepting or rejecting its moves to produce a chain known to have p as its invariant measure. Traditional methods use candidates essentially unconnected to p. We show that the class of candidate distributions developed in Part I (Stramer and Tweedie 1999), which "self-target" towards the high-density areas of p, produces Metropolis-Hastings algorithms with convergence rates that appear to be considerably better than those known for traditional candidate choices, such as the random walk. We illustrate this behavior for examples with exponential and polynomial tails, and for a logistic regression model using a Gibbs sampling algorithm. The detailed results are given in one dimension, but we indicate how they may extend successfully to higher dimensions.
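To make the idea concrete, here is a minimal one-dimensional sketch of a Metropolis-Hastings step with a Langevin-type "self-targeting" proposal (the Metropolis-adjusted Langevin algorithm): the candidate drifts along the gradient of log p toward high-density regions, and the accept/reject step makes p invariant. This is an illustrative sketch of the general construction, not the paper's specific algorithm; the step size `h` and the standard normal target in the usage example are assumptions for illustration.

```python
import numpy as np

def mala_step(x, log_p, grad_log_p, h, rng):
    """One Metropolis-Hastings step with a Langevin ("self-targeting")
    proposal: the candidate mean is shifted toward high-density areas of p."""
    # Propose: one Euler step of the Langevin diffusion plus Gaussian noise.
    mean_x = x + 0.5 * h * grad_log_p(x)
    y = mean_x + np.sqrt(h) * rng.standard_normal()
    mean_y = y + 0.5 * h * grad_log_p(y)
    # Log proposal densities q(y|x) and q(x|y); Gaussian normalizers cancel.
    log_q_xy = -0.5 * (y - mean_x) ** 2 / h
    log_q_yx = -0.5 * (x - mean_y) ** 2 / h
    # Accept or reject so that p is the invariant measure.
    log_alpha = log_p(y) - log_p(x) + log_q_yx - log_q_xy
    if np.log(rng.uniform()) < log_alpha:
        return y
    return x

# Usage: sample a standard normal target (h = 0.5 chosen for illustration).
rng = np.random.default_rng(0)
log_p = lambda x: -0.5 * x ** 2
grad_log_p = lambda x: -x
x, draws = 0.0, []
for _ in range(5000):
    x = mala_step(x, log_p, grad_log_p, h=0.5, rng=rng)
    draws.append(x)
```

Because the proposal q(y|x) is not symmetric, the ratio q(x|y)/q(y|x) must appear in the acceptance probability, unlike the random-walk case.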
Similar Resources
Exploring an Adaptive Metropolis Algorithm
While adaptive methods for MCMC are under active development, their utility has been under-recognized. We briefly review some theoretical results relevant to adaptive MCMC. We then suggest a very simple and effective algorithm to adapt proposal densities for random walk Metropolis and Metropolis adjusted Langevin algorithms. The benefits of this algorithm are immediate, and we demonstrate its p...
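A generic form of the adaptation described above can be sketched as follows: a random walk Metropolis sampler whose proposal scale is tuned on the fly by a diminishing Robbins-Monro rule toward a target acceptance rate. This is a hypothetical illustration of the general technique, not the specific algorithm of the paper; the target rate 0.44 (a standard one-dimensional heuristic) and the decay exponent 0.6 are assumptions.

```python
import numpy as np

def adaptive_rwm(log_p, n_iter, target_accept=0.44, rng=None):
    """Random walk Metropolis with a Robbins-Monro adaptation of the
    proposal scale (a generic scheme, for illustration only)."""
    rng = rng if rng is not None else np.random.default_rng()
    x, log_scale = 0.0, 0.0
    draws = []
    for t in range(1, n_iter + 1):
        # Symmetric Gaussian random-walk proposal with the current scale.
        y = x + np.exp(log_scale) * rng.standard_normal()
        accepted = np.log(rng.uniform()) < log_p(y) - log_p(x)
        if accepted:
            x = y
        # Diminishing adaptation: nudge the log-scale so the empirical
        # acceptance rate drifts toward the target rate.
        log_scale += t ** -0.6 * (float(accepted) - target_accept)
        draws.append(x)
    return np.array(draws), np.exp(log_scale)

# Usage: the scale self-tunes on a standard normal target.
draws, scale = adaptive_rwm(lambda x: -0.5 * x ** 2, 20000,
                            rng=np.random.default_rng(1))
```

The step sizes t^{-0.6} are summable in square but not in sum, the usual condition under which such adaptation settles down while still converging.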
Limit Theorems for Some Adaptive MCMC Algorithms with Subgeometric Kernels: Part II
We prove a central limit theorem for a general class of adaptive Markov Chain Monte Carlo algorithms driven by sub-geometrically ergodic Markov kernels. We discuss in detail the special case of stochastic approximation. We use the result to analyze the asymptotic behavior of an adaptive version of the Metropolis Adjusted Langevin algorithm with a heavy tailed target density.
Efficient uncertainty quantification techniques in inverse problems for Richards' equation using coarse-scale simulation models
This paper concerns efficient uncertainty quantification techniques in inverse problems for Richards' equation which use coarse-scale simulation models. We consider the problem of determining saturated hydraulic conductivity fields condi...
Gradient-based MCMC samplers for dynamic causal modelling
In this technical note, we derive two MCMC (Markov chain Monte Carlo) samplers for dynamic causal models (DCMs). Specifically, we use (a) Hamiltonian MCMC (HMC-E) where sampling is simulated using Hamilton's equation of motion and (b) Langevin Monte Carlo algorithm (LMC-R and LMC-E) that simulates the Langevin diffusion of samples using gradients either on a Euclidean (E) or on a Riemannian (R)...
Scalable MCMC for Mixed Membership Stochastic Blockmodels
We propose a stochastic gradient Markov chain Monte Carlo (SG-MCMC) algorithm for scalable inference in mixed-membership stochastic blockmodels (MMSB). Our algorithm is based on the stochastic gradient Riemannian Langevin sampler and achieves both faster speed and higher accuracy at every iteration than the current state-of-the-art algorithm based on stochastic variational inference. In additio...
Journal:
Volume, Issue:
Pages: -
Publication date: 1999